Research Insight
Integrating UAV-based Remote Sensing and Machine Learning to Monitor Rice Growth in Large-scale Fields 


Field Crop, 2025, Vol. 8, No. 4
Received: 17 Jun., 2025 Accepted: 29 Jul., 2025 Published: 20 Aug., 2025
As one of the most important food crops in the world, rice has a production level directly tied to food security and sustainable agricultural development. Monitoring rice growth under field conditions faces challenges such as widely distributed fields, pronounced environmental variability, and complex growth processes. Unmanned aerial vehicle (UAV) remote sensing, with its high spatial resolution and operational flexibility, provides a new approach for obtaining growth parameters of rice populations, while machine learning methods offer clear advantages in multi-source data processing, feature extraction, and pattern recognition. This study explores an integrated framework of UAV remote sensing and machine learning, summarizes the application characteristics of different sensors (RGB, multispectral, hyperspectral, and thermal infrared), assesses the applicability of vegetation indices, canopy structure parameters, and physiological and ecological indicators for rice growth monitoring, and analyzes the performance of models such as random forest, support vector machine, XGBoost, and deep learning. Case studies from the Yangtze River Basin of China and Southeast Asia demonstrate the application potential of this technology for rice yield prediction and disease monitoring. The results show that combining UAVs with machine learning can achieve precise monitoring of large-scale rice growth, which is of great significance for the development of precision agriculture and smart agriculture. This study aims to construct a monitoring framework that integrates UAV remote sensing and machine learning to achieve dynamic, precise, and large-scale assessment of rice growth status in the field.
1 Introduction
Rice (Oryza sativa L.) is one of the world's most important crops. It provides the staple food for more than half of the global population and is also one of China's most important grain crops. Maintaining high rice yields is crucial for addressing population growth, climate change, and resource shortages (Yuan et al., 2024; Chen et al., 2025). Achieving this requires monitoring rice growth at large scales; only then can fields be managed better, yields be increased, and sustainable agricultural development be promoted.
In the past, rice in the field was monitored mainly through manual survey and ground observation. These methods are slow and labor-intensive, offer limited temporal and spatial precision, and often fail to meet the needs of rapid decision-making. Although satellite remote sensing covers large areas, its resolution is relatively low and it is easily affected by clouds and the atmosphere, so it cannot capture field-level detail (Chen et al., 2024). In contrast, unmanned aerial vehicle (UAV) remote sensing has clear advantages: it can acquire high-resolution data quickly and flexibly without damaging the plants. Coupled with machine learning algorithms, UAVs can estimate biomass, nitrogen content, leaf area index (LAI), and yield relatively accurately even in complex field environments (Liu et al., 2023; Sarkar et al., 2023; Du et al., 2024; Ko et al., 2024). When processing imagery, methods such as random forests, support vector machines, and deep neural networks usually perform better than traditional regression and can extract more useful information from spectral and texture features (Zha et al., 2020; Bin et al., 2023; Wang et al., 2023).
This study reviews the significance of rice in global food security and analyzes the difficulties and latest progress in large-scale monitoring. It focuses on the combination of UAV remote sensing and machine learning and proposes a research framework and objectives for precise rice management. When these technologies are combined, they are expected to significantly improve monitoring accuracy and efficiency and to broaden the scope of application. Ultimately, such methods can promote sustainable food production and strengthen the risk resistance of agriculture. This study aims to provide theoretical and practical references for establishing a precise and efficient rice monitoring system and to promote the development of smart agriculture and digital villages.
2 UAV Remote Sensing Platforms and Data Acquisition Techniques
2.1 Advantages and limitations of UAVs in agricultural monitoring
Drones have many advantages in agricultural monitoring. They offer very high spatial and temporal resolution, fly flexibly, and collect detailed information at different crop growth stages. UAVs collect data quickly, do not damage plants, and have relatively low costs, which is particularly valuable for large-scale rice fields and precise management (Figure 1) (Cen et al., 2019). However, drones also have shortcomings: battery life is limited, flying is difficult in windy or rainy weather, and operations are subject to regulatory constraints. Careful flight planning and calibration are required to ensure data quality and coverage (Zha et al., 2020). Meanwhile, UAVs generate large volumes of image data, and processing and analyzing them requires substantial computing resources and specialized skills.
Figure 1 Illustration of the UAV system and radiometric calibration targets (Adapted from Cen et al., 2019)
2.2 Applications of different sensor types (RGB, multispectral, hyperspectral, thermal infrared)
Different sensor types can be mounted on UAVs, each with its own strengths:
RGB cameras: low cost and simple to operate, they capture high-resolution color images for evaluating canopy structure, color, and plant height, but provide limited spectral information (Yang et al., 2021).
Multispectral cameras: they acquire data in multiple bands, including the red edge and near-infrared, from which vegetation indices (such as NDVI) can be calculated for monitoring growth status, nitrogen level, and stress conditions (Kumar et al., 2024).
Hyperspectral cameras: they provide continuous information across hundreds of bands and are suitable for analyzing traits such as chlorophyll content, leaf area index (LAI), and aboveground biomass (Yu et al., 2017; Ban et al., 2022).
Thermal infrared cameras: they measure canopy temperature and are used to assess water status, evapotranspiration, and irrigation requirements (Wu et al., 2025).
The combined use of RGB, multispectral, hyperspectral, and thermal imaging can improve the accuracy and stability of rice growth monitoring and stress detection (Xu et al., 2022; Shen et al., 2024; Guo et al., 2025).
2.3 Role of multi-temporal and high-resolution imagery in monitoring rice growth
Acquiring UAV images at different growth stages of rice makes it possible to track growth dynamically, detect stress earlier, and support yield prediction. High-resolution images reveal small-scale differences within the field, facilitating targeted management. When multi-temporal images are combined with machine learning, leaf area index, biomass, and yield can be estimated more accurately, and the approach can also provide real-time decision support for fertilization, irrigation, and pest control (Luo et al., 2022; Chen et al., 2024; Gade et al., 2024).
3 Remote Sensing Characterization of Key Rice Growth Indicators
3.1 Vegetation indices (NDVI, EVI, PRI, etc.) and rice canopy parameters
Vegetation indices (VIs) such as NDVI, EVI, and PRI can be calculated from multispectral, hyperspectral, or RGB images collected by UAVs. These indices are often used to estimate rice canopy parameters, including leaf area index (LAI), chlorophyll content (SPAD), biomass, and nitrogen level. Studies have found that some indices, especially those containing red-edge and near-infrared bands, correlate strongly with LAI and SPAD across different growth stages (Zha et al., 2020; Ban et al., 2022). For instance, mND705, SAVI, and WDRVI perform well in estimating LAI, while indices such as GLI, RGRI, and ExR are better suited to reflecting biomass and nitrogen content (Prabhakar et al., 2024). Combining VIs with texture or structural features can further improve monitoring accuracy (Lyu, 2024).
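As a minimal illustration of how such indices are computed from calibrated reflectance bands (the standard NDVI and EVI formulations are used; the band arrays and values below are placeholders, not data from the cited studies), the following Python sketch evaluates them pixel by pixel with NumPy:

```python
import numpy as np

def ndvi(nir, red, eps=1e-6):
    """Normalized Difference Vegetation Index from reflectance arrays."""
    return (nir - red) / (nir + red + eps)

def evi(nir, red, blue, eps=1e-6):
    """Enhanced Vegetation Index with the commonly used coefficients."""
    return 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0 + eps)

# Placeholder reflectance mosaics standing in for a multispectral UAV orthomosaic.
rng = np.random.default_rng(0)
nir = rng.uniform(0.30, 0.60, size=(100, 100))
red = rng.uniform(0.03, 0.10, size=(100, 100))
blue = rng.uniform(0.02, 0.06, size=(100, 100))

print("mean NDVI:", round(float(ndvi(nir, red).mean()), 3))
print("mean EVI:", round(float(evi(nir, red, blue).mean()), 3))
```

Plot-level summaries of such index maps are what typically enter the regression models discussed in the following sections.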
3.2 Structural parameters extracted from UAV imagery (plant height, canopy coverage, LAI)
Several structural parameters, such as plant height, canopy coverage, and leaf area index (LAI), can be derived directly from UAV imagery, and these indicators are closely related to rice growth and yield. Plant height can be calculated from RGB or multispectral images through digital surface models, while coverage and LAI can be estimated through spectral and texture analysis (Duan et al., 2019; Liao et al., 2025). Studies have found that combining canopy height with VIs yields more accurate LAI estimates with smaller errors and also reduces the impact of phenological changes (Gong et al., 2021). High-resolution UAV images can also map the distribution of these parameters across the field, providing a reference for precise management (Qiu et al., 2020).
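As a simple sketch of this idea (the raster arrays, the bare-ground model, and the zero threshold on the excess-green index are illustrative assumptions), plant height can be approximated as the difference between a digital surface model and a bare-ground elevation model, and canopy coverage as the fraction of pixels that a greenness index classifies as vegetation:

```python
import numpy as np

def canopy_height(dsm, dtm):
    """Canopy height model: crop surface elevation minus bare-ground elevation (m)."""
    return np.clip(dsm - dtm, 0.0, None)

def canopy_cover(red, green, blue, threshold=0.0):
    """Fraction of pixels classified as vegetation by the excess-green index."""
    exg = 2.0 * green - red - blue
    return float((exg > threshold).mean())

# Toy rasters standing in for photogrammetric products of one plot.
rng = np.random.default_rng(1)
dtm = rng.normal(10.0, 0.02, size=(50, 50))          # bare-soil elevation
dsm = dtm + rng.uniform(0.4, 0.9, size=(50, 50))     # canopy surface during tillering
red, green, blue = rng.uniform(0.1, 0.5, size=(3, 50, 50))

print("mean plant height (m):", round(float(canopy_height(dsm, dtm).mean()), 2))
print("canopy cover fraction:", round(canopy_cover(red, green, blue), 2))
```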
3.3 Remote sensing inversion methods for rice physiological and ecological indicators
Nowadays, many studies have begun to use machine learning methods, such as random forests, support vector regression and neural networks, to invert the physiological and ecological indicators of rice. These methods, combining spectral, texture and structural features, can predict LAI, SPAD, biomass, nitrogen nutrient index (NNI) and yield relatively accurately (Liu et al., 2023; Wang et al., 2023). If the data from drones and satellites are combined and deep learning models are used, the prediction accuracy and spatial details will be better. Relatively stable monitoring can be achieved even when the field conditions are very complex (Chen et al., 2024; Li et al., 2024). Therefore, the combination of multi-source and multi-temporal data with advanced modeling is the key to achieving high-precision and large-scale rice monitoring (Li and Jiong, 2024).
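As a schematic example of such an inversion (the feature set, data, and model settings are assumptions for illustration, not the configuration of any cited study), a random forest regressor can map combined spectral, texture, and structural features to LAI and report which features drive the prediction:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Synthetic plot-level features: vegetation indices, one texture metric, canopy height.
rng = np.random.default_rng(2)
n = 200
features = {
    "NDVI": rng.uniform(0.2, 0.9, n),
    "red_edge_index": rng.uniform(0.1, 0.6, n),
    "glcm_contrast": rng.uniform(0.0, 1.0, n),
    "canopy_height_m": rng.uniform(0.3, 1.1, n),
}
X = np.column_stack(list(features.values()))
# Proxy LAI target constructed for demonstration only.
lai = 5.0 * features["NDVI"] + 1.5 * features["canopy_height_m"] + rng.normal(0, 0.3, n)

model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, lai)
for name, importance in zip(features, model.feature_importances_):
    print(f"{name}: {importance:.2f}")
```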
4 Applications of Machine Learning in Rice Growth Monitoring
4.1 Suitability analysis of common algorithms (RF, SVM, XGBoost, deep learning)
In rice growth monitoring, machine learning (ML) is widely used to analyze data collected by UAVs. Among traditional methods, random forest (RF) and support vector machine (SVM) are the most common; they handle nonlinear relationships and give relatively stable results. Studies have found that RF is often more accurate than linear regression and other models in estimating nitrogen status, biomass, and growth stage (Zha et al., 2020; Qiu et al., 2021). SVM is mostly used for classification, such as land-cover classification and growth-stage division, and sometimes achieves the highest accuracy (Ramadhani et al., 2020; Guo et al., 2021; Fatchurrachman et al., 2023). Extreme gradient boosting (XGBoost) and decision trees also perform well in predicting yield and biomass, especially when dealing with large-scale, multi-source data (Singha and Swain, 2023). Deep learning methods, such as convolutional neural networks (CNNs), EfficientNet, ResNet, and YOLO, are better suited to image-based tasks. They are often used to identify growth stages and detect rice panicles, reaching accuracies above 95%, and are more reliable than traditional models in complex field environments (Zheng et al., 2025).
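To make the comparison concrete, the sketch below (synthetic features and labels; gradient boosting from scikit-learn stands in for XGBoost, which exposes a compatible interface) scores several classifiers on the same growth-stage classification task with cross-validation:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic samples: a few plot-level features and three growth-stage classes.
rng = np.random.default_rng(3)
X = rng.normal(size=(300, 4))
y = rng.integers(0, 3, size=300)

models = {
    "RF": RandomForestClassifier(n_estimators=200, random_state=0),
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf")),
    "GBDT": GradientBoostingClassifier(random_state=0),  # stand-in for XGBoost
}
for name, model in models.items():
    accuracy = cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()
    print(f"{name}: {accuracy:.3f}")
```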
4.2 Feature selection, model training, and validation approaches
For a model to perform well, choosing the right input features is crucial. Commonly used features include vegetation indices, texture features, structural parameters (such as plant height and canopy coverage), and multi-temporal data (Sheng et al., 2022; Wang et al., 2023). Feature selection methods include stepwise regression, random forest importance ranking, and embedded methods in deep learning, all of which help pick out the most useful variables (Ge et al., 2024). When training a model, the data are usually split into training and validation sets, and cross-validation and independent test sets are used to check generalization and avoid overfitting (Lyu et al., 2023). Common evaluation metrics include accuracy, precision, recall, and F1 score for classification, as well as R² and RMSE for regression (Sheng et al., 2022; Guo et al., 2023).
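A minimal validation sketch along these lines (placeholder data; the random forest and split ratios are arbitrary choices for illustration) holds out an independent test set for R² and RMSE and adds k-fold cross-validation on the training data:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import KFold, cross_validate, train_test_split

# Placeholder feature matrix (indices, texture, height) and a biomass-like target.
rng = np.random.default_rng(4)
X = rng.normal(size=(180, 6))
y = 1.8 * X[:, 0] - 0.7 * X[:, 2] + rng.normal(0, 0.25, 180)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

# Independent test set: R² and RMSE.
pred = model.predict(X_test)
print("test R²:", round(r2_score(y_test, pred), 3))
print("test RMSE:", round(mean_squared_error(y_test, pred) ** 0.5, 3))

# 5-fold cross-validation on the training set guards against overfitting.
cv = cross_validate(model, X_train, y_train,
                    cv=KFold(n_splits=5, shuffle=True, random_state=0),
                    scoring=("r2", "neg_root_mean_squared_error"))
print("CV R²:", round(float(cv["test_r2"].mean()), 3))
```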
4.3 Challenges of model generalization and cross-regional transferability
Although these models perform well in experiments, problems remain when they are extended to different regions, varieties, and environmental conditions. For example, sensor angle, image resolution, plant overlap, and field-to-field differences can all affect model stability (Zha et al., 2020; Tan et al., 2022; Tseng et al., 2022). In addition, differing management practices, soil types, and climatic conditions limit model transferability. To address these issues, researchers have attempted to incorporate soil, weather, and management information using methods such as transfer learning, multi-site training, and data fusion (Iatrou et al., 2021). Recent research emphasizes that improving model generalization requires richer multi-temporal data and standardized workflows so that models remain stable when applied at large scale.
5 Integration Framework of UAV Remote Sensing and Machine Learning
5.1 Data preprocessing and feature engineering (illumination correction, noise reduction, variable construction)
The first step in integrating UAV remote sensing with machine learning is careful data preprocessing and feature engineering. Common operations include illumination correction (adapting to different lighting conditions), noise reduction (removing sensor and environmental interference), and geometric and radiometric calibration (ensuring data consistency) (Zha et al., 2020; Shen et al., 2024). Variables are then constructed, such as vegetation indices, texture features, and structural parameters (plant height, canopy coverage, etc.), which help capture the relationship between spectra and rice growth indicators. Feature selection methods (such as recursive feature elimination or a model's built-in selection mechanism) can identify the most informative variables to use as inputs.
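The sketch below illustrates two of these steps under simplifying assumptions (a single-panel, gain-only version of the empirical line method and placeholder feature values): raw digital numbers are scaled to reflectance with a calibration panel of known reflectance, and recursive feature elimination keeps the most informative variables:

```python
import numpy as np
from sklearn.feature_selection import RFE
from sklearn.linear_model import LinearRegression

def empirical_line(raw_band, panel_dn, panel_reflectance):
    """Single-panel (gain-only) empirical-line scaling of digital numbers to reflectance."""
    gain = panel_reflectance / panel_dn
    return raw_band * gain

# Toy data: raw NIR digital numbers and a grey panel of 50% known reflectance.
rng = np.random.default_rng(5)
raw_nir = rng.uniform(8000, 20000, size=(60, 60))
nir_reflectance = empirical_line(raw_nir, panel_dn=16000.0, panel_reflectance=0.50)

# Candidate variables built from corrected imagery (placeholder values),
# e.g. vegetation indices, textures, and height metrics, plus a growth target.
X = rng.normal(size=(150, 10))
y = 2.0 * X[:, 1] + rng.normal(0, 0.2, 150)
selector = RFE(LinearRegression(), n_features_to_select=4).fit(X, y)
print("selected feature mask:", selector.support_)
```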
5.2 Multi-source data fusion methods for model input
Combining UAV images (multispectral or hyperspectral), satellite data, meteorological information, and soil data can make models more robust and improve prediction accuracy. Fusion methods include aligning UAV images with satellite images through scale transformation, or integrating spectral, texture, and auxiliary data into a comprehensive feature set (Jin et al., 2024). Studies have found that after combining UAV and satellite data, the inversion accuracy of indicators such as leaf area index (LAI) and yield improves significantly, and adding meteorological and field management data improves the models further (Islam et al., 2023; Chen et al., 2024). At present, multi-source feature fusion (such as combining average spectra, vegetation indices, and textures) generally performs better than any single data source.
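A common and simple fusion strategy is feature-level concatenation: per-plot UAV features, satellite indices, and weather summaries are joined on a shared plot identifier into one table before modeling. The column names and values below are placeholders, not data from any cited study:

```python
import pandas as pd

# Per-plot feature blocks from three sources (placeholder values).
uav = pd.DataFrame({"plot_id": [1, 2, 3],
                    "uav_ndvi": [0.72, 0.65, 0.80],
                    "canopy_height_m": [0.82, 0.74, 0.95]})
satellite = pd.DataFrame({"plot_id": [1, 2, 3],
                          "s2_ndvi": [0.68, 0.61, 0.77]})
weather = pd.DataFrame({"plot_id": [1, 2, 3],
                        "gdd_cumulative": [1450, 1432, 1468],
                        "rain_mm": [210, 198, 225]})

# Feature-level fusion: inner joins on the plot identifier produce one row per plot
# containing UAV, satellite, and weather columns, ready for model input.
fused = uav.merge(satellite, on="plot_id").merge(weather, on="plot_id")
print(fused)
```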
5.3 Pipeline design for UAV- and ML-based rice growth monitoring
A typical UAV + machine learning monitoring pipeline includes the following steps (a minimal sketch is given after this list):
Data collection: acquire high-resolution UAV imagery together with auxiliary data such as weather, soil, and management records (Chen et al., 2024).
Preprocessing: perform illumination correction, noise reduction, calibration, and registration on the images.
Feature engineering: extract and screen relevant spectral, texture, and structural features.
Data fusion: integrate multi-source data to enrich the model input (Du et al., 2024).
Model training and validation: train machine learning algorithms (such as RF, SVM, or ensemble methods) and evaluate them with cross-validation and independent test sets (Zha et al., 2020; Islam et al., 2023).
Prediction and mapping: generate spatial distribution maps of rice growth to support precise management (Shen et al., 2024).
This workflow is modular, supports scalable, real-time, and precise rice monitoring, and can provide data support for precision agriculture.
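The modular workflow above maps naturally onto a machine learning pipeline object. The following sketch (feature values, the selected algorithm, and all parameters are illustrative assumptions, not the configuration of any cited study) chains scaling, feature selection, and regression so that the same object can be retrained and then applied to new imagery-derived features for mapping:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.feature_selection import SelectKBest, f_regression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Placeholder feature table: vegetation indices, texture, structure, weather covariates.
rng = np.random.default_rng(7)
X = rng.normal(size=(240, 12))
y = 2.0 * X[:, 0] + X[:, 3] + rng.normal(0, 0.2, 240)   # proxy growth target

pipeline = Pipeline([
    ("scale", StandardScaler()),
    ("select", SelectKBest(f_regression, k=6)),
    ("model", RandomForestRegressor(n_estimators=300, random_state=0)),
])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
pipeline.fit(X_train, y_train)
print("held-out R²:", round(r2_score(y_test, pipeline.predict(X_test)), 3))
```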
6 Case Studies: Monitoring Practices in Large-scale Rice Fields
6.1 UAV monitoring cases in major rice production areas of the middle and lower Yangtze River Basin, China
UAV remote sensing is a flexible and efficient technology for obtaining information on the farmland environment and crop growth, and in recent years it has been increasingly applied in agricultural production and research. Studies in the Yangtze River Basin have found it highly effective for rice monitoring and management. For instance, by combining UAV-derived vegetation indices with deep image features and analyzing them with a random forest model, nutrient deficiencies in rice can be accurately identified and reasonable fertilization plans formulated, with a classification accuracy above 96% (Figure 2) (Chen et al., 2025). In addition, combining UAV data with satellite data can improve the accuracy of rice growth and pest monitoring. In demonstration fields near Nanjing, this approach proved effective for monitoring pests such as Cnaphalocrocis medinalis; it also inverted leaf area index (LAI) more accurately and provided clearer spatial information, facilitating precise management (Chen et al., 2024).
Figure 2 Field imagery acquisition and dataset preparation. (A) Specific parameters of the UAV camera and the field flight mission. (B) Acquisition and stitching of field imagery. (C) Calculation and merging of features as well as dataset preparation (Adapted from Chen et al., 2025)
6.2 Multi-temporal UAV monitoring and yield prediction studies in Southeast Asia
In Southeast Asia and neighboring regions, multi-temporal UAV monitoring has been widely used to assess rice growth and predict yield. In southern China, researchers used UAVs equipped with RGB and multispectral cameras to track the entire rice-growing season, extracted vegetation indices, canopy height, and coverage, and combined them with a random forest model to predict yield, achieving a coefficient of determination (R²) of 0.85 that transferred across years and plots (Wan et al., 2020). In Japan, researchers compared different UAV-mounted multispectral cameras and found that indices such as NDVI and VARI tracked growth well at different growth stages, with an economical and efficient workflow (Dimyati et al., 2023). In Southeast Asia, RGB images are also used for early-stage monitoring and yield estimation, helping farmers take timely management measures (Sari et al., 2021; Luu et al., 2023).
6.3 Deep learning-based monitoring of rice diseases and precision management practices
The combination of deep learning and UAV imagery has brought new methods to rice disease monitoring and precision management. For instance, deep neural networks can extract features from images to identify nutrient deficiencies and guide fertilization; this approach achieves high monitoring accuracy and can help improve rice growth (Chen et al., 2025). UAV monitoring is often combined with multispectral imaging and spatial analysis to map the distribution of diseases and pests such as bacterial leaf blight and stem borer, supporting zonal management and precise control (Kharim et al., 2022). In addition, deep convolutional neural networks have been used to analyze drought responses and changes in stress symptoms over time, providing support for genetic research and the breeding of stress-resistant rice varieties (Jiang et al., 2021).
7 Conclusion and Prospect
UAV remote sensing can quickly and non-destructively monitor rice growth, nutrient status, water stress, and disease. It provides centimeter-level high-resolution images and allows flexible, timely data collection, making up for the shortcomings of traditional surveys and satellite imagery, which are often time-consuming, labor-intensive, sometimes destructive, and easily affected by weather. UAV-based monitoring supports precise fertilization and irrigation, early detection of pests and diseases, and yield prediction. This not only improves resource-use efficiency and reduces input costs but also makes rice production more sustainable.
If the data from drones is combined with ground sensors (such as Internet of Things weather stations) and satellite data, multi-scale and multi-source monitoring can be achieved. Drones have higher spatial and temporal accuracy in field management, while satellites can cover larger areas. Through data fusion (such as scale transformation or machine learning methods), leaf area index (LAI) and yield can be estimated more accurately, and field details can also be retained during large-scale monitoring. This approach is conducive to dynamic decision-making and better integrates plot management with regional agricultural policies.
In the future, rice farming will move toward integrating UAV remote sensing, machine learning, and digital agriculture platforms into intelligent monitoring and decision-making systems. With the development of artificial intelligence, big data, and cloud computing, these systems will become increasingly easy to use; they can assess crop health in real time, monitor diseases, and manage crops by zone. This not only raises the level of precision agriculture but also reduces labor and costs and promotes intensive, sustainable production. Current research priorities include extending UAV endurance, increasing sensor payload capacity and data processing efficiency, and developing more robust, scalable models that adapt to different environments.
Acknowledgments
We would like to express our gratitude to the reviewers for their valuable feedback, which helped improve the manuscript.
Conflict of Interest Disclosure
The authors affirm that this research was conducted without any commercial or financial relationships that could be construed as a potential conflict of interest.
Ban S., Liu W., Tian M., Wang Q., Yuan T., Chang Q., and Li L., 2022, Rice leaf chlorophyll content estimation using UAV-based spectral images in different regions, Agronomy, 12(11): 2832.
https://doi.org/10.3390/agronomy12112832
Cen H., Wan L., Zhu J., Li Y., Li X., Zhu Y., Weng H., Wu W., Yin W., Xu C., Bao Y., Feng L., Shou J., and He Y., 2019, Dynamic monitoring of biomass of rice under different nitrogen treatments using a lightweight UAV with dual image-frame snapshot cameras, Plant Methods, 15(1): 32.
https://doi.org/10.1186/s13007-019-0418-8
Chen B., Su Q., Li Y., Chen R., Yang W., and Huang C., 2025, Field rice growth monitoring and fertilization management based on UAV spectral and deep image feature fusion, Agronomy, 15(4): 886.
https://doi.org/10.3390/agronomy15040886
Chen C., Bao Y., Zhu F., and Yang R., 2024, Remote sensing monitoring of rice growth under Cnaphalocrocis medinalis (Guenée) damage by integrating satellite and UAV remote sensing data, International Journal of Remote Sensing, 45(3): 772-790.
https://doi.org/10.1080/01431161.2024.2302350
Dimyati M., Supriatna S., Nagasawa R., Pamungkas F., and Pramayuda R., 2023, A comparison of several UAV-based multispectral imageries in monitoring rice paddy (a case study in paddy fields in Tottori prefecture, Japan), ISPRS International Journal of Geo-Information, 12(2): 36.
https://doi.org/10.3390/ijgi12020036
Du X., Zheng L., Zhu J., and He Y., 2024, Enhanced leaf area index estimation in rice by integrating UAV-based multi-source data, Remote Sensing, 16(7): 1138.
https://doi.org/10.3390/rs16071138
Duan B., Liu Y., Gong Y., Peng Y., Wu X., Zhu R., and Fang S., 2019, Remote estimation of rice LAI based on Fourier spectrum texture from UAV image, Plant Methods, 15(1): 124.
https://doi.org/10.1186/s13007-019-0507-8
Fatchurrachman, Soh N., Shah R., Giap S., Setiawan B., and Minasny B., 2023, Automated near-real-time mapping and monitoring of rice growth extent and stages in Selangor Malaysia, Remote Sensing Applications: Society and Environment, 31: 100993.
https://doi.org/10.1016/j.rsase.2023.100993
Gade S., Madolli M., García‐Caparrós P., Ullah H., Cha-Um S., Datta A., and Himanshu S., 2024, Advancements in UAV remote sensing for agricultural yield estimation: a systematic comprehensive review of platforms, sensors, and data analytics, Remote Sensing Applications: Society and Environment, 37: 101418.
https://doi.org/10.1016/j.rsase.2024.101418
Ge J., Zhang H., Xu L., Huang W., Jiang J., Song M., Guo Z., and Wang C., 2024, Full cycle rice growth monitoring with dual-pol SAR data and interpretable deep learning, International Journal of Digital Earth, 17(1): 2445639.
https://doi.org/10.1080/17538947.2024.2445639
Gong Y., Yang K., Lin Z., Fang S., Wu X., Zhu R., and Peng Y., 2021, Remote estimation of leaf area index (LAI) with unmanned aerial vehicle (UAV) imaging for different rice cultivars throughout the entire growing season, Plant Methods, 17(1): 88.
https://doi.org/10.1186/s13007-021-00789-4
Guo X., Ou Y., Deng K., Fan X., Gao R., and Zhou Z., 2025, A unmanned aerial vehicle-based image information acquisition technique for the middle and lower sections of rice plants and a predictive algorithm model for pest and disease detection, Agriculture, 15(7): 790.
https://doi.org/10.3390/agriculture15070790
Guo X., Yin J., Li K., Yang J., Zou H., and Yang F., 2023, Fine classification of rice paddy using multitemporal compact polarimetric SAR C band data based on machine learning methods, Frontiers of Earth Science, 18(1): 30-43.
https://doi.org/10.1007/s11707-022-1011-4
Guo Y., Fu Y., Hao F., Zhang X., Wu W., Jin X., Bryant C., and Senthilnath J., 2021, Integrated phenology and climate in rice yields prediction using machine learning methods, Ecological Indicators, 120: 106935.
https://doi.org/10.1016/J.ECOLIND.2020.106935
Iatrou M., Karydas C., Iatrou G., Pitsiorlas I., Aschonitis V., Raptis I., Mpetas S., Kravvas K., and Mourelatos S., 2021, Topdressing nitrogen demand prediction in rice crop using machine learning systems, Agriculture, 11(4): 312.
https://doi.org/10.3390/AGRICULTURE11040312
Islam M., Di L., Qamer F., Shrestha S., Guo L., Lin L., Mayer T., and Phalke A., 2023, Rapid rice yield estimation using integrated remote sensing and meteorological data and machine learning, Remote Sensing, 15(9): 2374.
https://doi.org/10.3390/rs15092374
Jiang Z., Tu H., Bai B., Yang C., Zhao B., Guo Z., Liu Q., Zhao H., Yang W., Xiong L., and Zhang J., 2021, Combining UAV-RGB high-throughput field phenotyping and genome-wide association study to reveal genetic variation of rice germplasms in dynamic response to drought stress, New Phytologist, 232(1): 440-455.
https://doi.org/10.1111/nph.17580
Jin Z., Guo S., Li S., Yu F., and Xu T., 2024, Research on the rice fertiliser decision-making method based on UAV remote sensing data assimilation, Computers and Electronics in Agriculture, 216: 108508.
https://doi.org/10.1016/j.compag.2023.108508
Kharim M., Wayayok A., Abdullah A., Shariff A., Husin E., and Mahadi M., 2022, Predictive zoning of pest and disease infestations in rice field based on UAV aerial imagery, The Egyptian Journal of Remote Sensing and Space Science, 25(3): 831-840.
https://doi.org/10.1016/j.ejrs.2022.08.001
Ko J., Shin T., Kang J., Baek J., and Sang W., 2024, Combining machine learning and remote sensing-integrated crop modeling for rice and soybean crop simulation, Frontiers in Plant Science, 15: 1320969.
https://doi.org/10.3389/fpls.2024.1320969
Kumar M., Bhattacharya B., Pandya M., and Handique B., 2024, Machine learning based plot level rice lodging assessment using multi-spectral UAV remote sensing, Computers and Electronics in Agriculture, 219: 108754.
https://doi.org/10.1016/j.compag.2024.108754
Li J.Q., and Jiong F., 2024, Genomic diversity and evolutionary mechanisms in the Oryza genus: a comparative analysis, Genomics and Applied Biology, 15(1): 54-63.
https://doi.org/10.5376/gab.2024.15.0008
Li J., Lu J., Fu H., Zou W., Zhang W., Yu W., and Feng Y., 2024, Research on the inversion of key growth parameters of rice based on multisource remote sensing data and deep learning, Agriculture, 14(12): 2326.
https://doi.org/10.3390/agriculture14122326
Liao M., Wang Y., Chu N., Li S., Zhang Y., and Lin D., 2025, Mature rice biomass estimation using UAV-derived RGB vegetation indices and growth parameters, Sensors, 25(9): 2798.
https://doi.org/10.3390/s25092798
Liu S., Zhang B., Yang W., Chen T., Zhang H., Lin Y., Tan J., Li X., Gao Y., Yao S., Lan Y., and Zhang L., 2023, Quantification of physiological parameters of rice varieties based on multi-spectral remote sensing and machine learning models, Remote Sensing, 15(2): 453.
https://doi.org/10.3390/rs15020453
Luo S., Jiang X., Jiao W., Yang K., Li Y., and Fang S., 2022, Remotely sensed prediction of rice yield at different growth durations using UAV multispectral imagery, Agriculture, 12(9): 1447.
https://doi.org/10.3390/agriculture12091447
Luu T., Tam N., Phuc P., Nguyen H., Van Le L., and Ngo Q., 2023, Evaluation of land roughness and weather effects on paddy field using cameras mounted on drone: a comprehensive analysis from early to mid-growth stages, Journal of King Saud University-Computer and Information Sciences, 35(10): 101853.
https://doi.org/10.1016/j.jksuci.2023.101853
Lyu J., 2024, High yield strategies in rice cultivation: agronomic practices and innovations, Bioscience Evidence, 14(6): 270-280.
https://doi.org/10.5376/be.2024.14.0028
Lyu M., Lu X., Shen Y., Tan Y., Wan L., Shu Q., He Y., He Y., and Cen H., 2023, UAV time-series imagery with novel machine learning to estimate heading dates of rice accessions for breeding, Agricultural and Forest Meteorology, 341: 109646.
https://doi.org/10.1016/j.agrformet.2023.109646
Ma B., Cao G., Hu C., and Chen C., 2023, Monitoring the rice panicle blast control period based on UAV multispectral remote sensing and machine learning, Land, 12(2): 469.
https://doi.org/10.3390/land12020469
Prabhakar M., Gopinath K., Kumar N., Thirupathi M., Sravan U., Kumar G., Siva G., Chandana P., and Singh V., 2024, Mapping leaf area index at various rice growth stages in Southern India using airborne hyperspectral remote sensing, Remote Sensing, 16(6): 954.
https://doi.org/10.3390/rs16060954
Qiu Z., Ma F., Li Z., Xu X., Ge H., and Du C., 2021, Estimation of nitrogen nutrition index in rice from UAV RGB images coupled with machine learning algorithms, Computers and Electronics in Agriculture, 189: 106421.
https://doi.org/10.1016/j.compag.2021.106421
Qiu Z., Xiang H., Ma F., and Du C., 2020, Qualifications of rice growth indicators optimized at different growth stages using unmanned aerial vehicle digital imagery, Remote Sensing, 12(19): 3228.
https://doi.org/10.3390/rs12193228
Ramadhani F., Pullanagari R., Kereszturi G., and Procter J., 2020, Mapping of rice growth phases and bare land using Landsat-8 OLI with machine learning, International Journal of Remote Sensing, 41(21): 8428-8452.
https://doi.org/10.1080/01431161.2020.1779378
Sari M., Hassim Y., Hidayat R., and Ahmad A., 2021, Monitoring rice crop and paddy field condition using UAV RGB imagery, JOIV: International Journal on Informatics Visualization, 5(4): 469-474.
https://doi.org/10.30630/joiv.5.4.742
Sarkar T., Roy D., Kang Y., Jun S., Park J., and Ryu C., 2023, Ensemble of machine learning algorithms for rice grain yield prediction using UAV-based remote sensing, Journal of Biosystems Engineering, 49(1): 1-19.
https://doi.org/10.1007/s42853-023-00209-6
Shen Y., Yan Z., Yang Y., Tang W., Sun J., and Zhang Y., 2024, Application of UAV-Borne visible-infared pushbroom imaging hyperspectral for rice yield estimation using feature selection regression methods, Sustainability, 16(2): 632.
https://doi.org/10.3390/su16020632
Sheng R., Huang Y., Chan P., Bhat S., Wu Y., and Huang N., 2022, Rice growth stage classification via RF-based machine learning and image processing, Agriculture, 12(12): 2137.
https://doi.org/10.3390/agriculture12122137
Singha C., and Swain K., 2023, Rice crop growth monitoring with sentinel 1 SAR data using machine learning models in google earth engine cloud, Remote Sensing Applications: Society and Environment, 32: 101029.
https://doi.org/10.1016/j.rsase.2023.101029
Tan S., Liu J., Lu H., Lan M., Yu J., Liao G., Wang Y., Li Z., Qi L., and Ma X., 2022, Machine learning approaches for rice seedling growth stages detection, Frontiers in Plant Science, 13: 914771.
https://doi.org/10.3389/fpls.2022.914771
Tseng H., Yang M., Saminathan R., Hsu Y., Yang C., and Wu D., 2022, Rice seedling detection in UAV images using transfer learning and machine learning, Remote Sensing, 14(12): 2837.
https://doi.org/10.3390/rs14122837
Wan L., Cen H., Zhu J., Zhang J., Zhu Y., Sun D., Du X., Li Z., Weng H., Li Y., Li X., Bao Y., Shou J., and He Y., 2020, Grain yield prediction of rice using multi-temporal UAV-based RGB and multispectral images and model transfer - a case study of small farmlands in the South of China, Agricultural and Forest Meteorology, 291: 108096.
https://doi.org/10.1016/j.agrformet.2020.108096
Wang D., Li R., Liu T., Liu S., Sun C., and Guo W., 2023, Combining vegetation, color, and texture indices with hyperspectral parameters using machine-learning methods to estimate nitrogen concentration in rice stems and leaves, Field Crops Research, 304: 109175.
https://doi.org/10.1016/j.fcr.2023.109175
Wang Y., Tan S., Jia X., Qi L., Liu S., Lu H., Wang C., Liu W., Zhao X., He L., Chen J., Yang C., Wang X., Chen J., Qin Y., Yu J., and Ma X., 2023, Estimating relative chlorophyll content in rice leaves using unmanned aerial vehicle multi-spectral images and spectral-textural analysis, Agronomy, 13(6): 1541.
https://doi.org/10.3390/agronomy13061541
Wu T., Liu K., Cheng M., Gu Z., Guo W., and Jiao X., 2025, Paddy field scale evapotranspiration estimation based on two-source energy balance model with energy flux constraints and UAV multimodal data, Remote Sensing, 17(10): 1662.
https://doi.org/10.3390/rs17101662
Xu T., Wang F., Xie L., Yao X., Zheng J., Li J., and Chen S., 2022, Integrating the textural and spectral information of UAV hyperspectral images for the improved estimation of rice aboveground biomass, Remote Sensing, 14(11): 2534.
https://doi.org/10.3390/rs14112534
Yang M., Tseng H., Hsu Y., Yang C., Lai M., and Wu D., 2021, A UAV open dataset of rice paddies for deep learning practice, Remote Sensing, 13(7): 1358.
https://doi.org/10.3390/rs13071358
Yu F., Xu T., Wen D., Hang M., Zhang G., and Chen C., 2017, Radiative transfer models (RTMs) for field phenotyping inversion of rice based on UAV hyperspectral remote sensing, International Journal of Agricultural and Biological Engineering, 10(4): 150-157.
https://doi.org/10.25165/IJABE.V10I4.3076
Yuan J., Zhang Y., Zheng Z., Yao W., Wang W., and Guo L., 2024, Grain crop yield prediction using machine learning based on UAV remote sensing: a systematic literature review, Drones, 8(10): 559.
https://doi.org/10.3390/drones8100559
Zha H., Miao Y., Wang T., Li Y., Zhang J., Sun W., Feng Z., and Kusnierek K., 2020, Improving unmanned aerial vehicle remote sensing-based rice nitrogen nutrition index prediction with machine learning, Remote Sensing, 12(2): 215.
https://doi.org/10.3390/rs12020215
Zheng H., Liu C., Zhong L., Wang J., Huang J., Lin F., Ma X., and Tan S., 2025, An android-smartphone application for rice panicle detection and rice growth stage recognition using a lightweight YOLO network, Frontiers in Plant Science, 16: 1561632.
https://doi.org/10.3389/fpls.2025.1561632